Modern phones are equipped with numerous sensors, such as cameras, microphones, GPS receivers, accelerometers, and health monitors. This project aims to design a 'Virtual Information Telescope', where the telescope's 'lenses' are a metaphor for the sensors in people's mobile phones. Using such a telescope, an Internet user will be able to zoom into any part of the human-populated world and observe events of interest. Users will be able to direct queries to phones located in a given region and receive real-time responses through automatic sensing or explicit human participation. Example domains that may benefit from this platform include education, healthcare, tourism, disaster management, environmental conservation, and social collaboration. Perhaps more fundamentally, a virtual information telescope may change the way we browse, query, learn, and process information.
This talk will expand on this vision and instantiate it through a live system called 'Micro-Blog' (see the project webpage at http://synrg.ee.duke.edu/microblog.html). We will discuss a suite of important research challenges underlying the translation of Micro-Blog into a deployable, usable system. Of particular interest are energy-efficient localization, sensor-augmented context identification, and location privacy. We will close with some thoughts on what lies ahead along the path of mobile, social computing.